CS 174 Lecture 10 John Canny
Abstract
But we already saw that for some random variables (e.g. the number of balls in a bin) the probability falls off exponentially with distance from the mean. So Markov and Chebyshev give very poor bounds for those kinds of random variables. The Chernoff bound applies to a particular class of random variables and does give exponential fall-off of probability with distance from the mean. The critical condition needed for a Chernoff bound is that the random variable be a sum of independent indicator random variables. Since that's true for balls in bins, Chernoff bounds apply.
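For concreteness, one standard form of the bound (a sketch of the statement, not necessarily the exact variant used in the lecture): if X = X_1 + ... + X_n is a sum of independent indicator random variables with mean \mu = E[X], then for 0 < \delta \le 1,

\[
\Pr[X \ge (1+\delta)\mu] \;\le\; e^{-\mu\delta^{2}/3},
\qquad
\Pr[X \le (1-\delta)\mu] \;\le\; e^{-\mu\delta^{2}/2},
\]

so the tail probability decays exponentially in the distance \delta\mu from the mean, in contrast to the polynomial decay given by Markov and Chebyshev.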
Similar resources
CS 174 Lecture 3 John Canny
For every permutation where 4 is before 5, there is a matching permutation where 5 is before 4, obtained by swapping 4 and 5. That’s a 1-1 correspondence between the two sets of permutations, so they must have the same size, and partition the set of all permutations into two halves. Since all permutations are equally likely, the total probability for the event (4 is before 5) equals the probabi...
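A minimal sketch (not from the lecture notes) that checks this symmetry argument by enumeration: over all permutations of {1,...,5}, swapping 4 and 5 maps the "4 before 5" permutations onto the "5 before 4" permutations, so each event has probability 1/2.

```python
from itertools import permutations

# All 5! = 120 equally likely orderings of 1..5.
perms = list(permutations(range(1, 6)))
before = [p for p in perms if p.index(4) < p.index(5)]

print(len(before) / len(perms))  # 0.5

def swap45(p):
    # Exchange the positions of 4 and 5 in a permutation.
    return tuple(5 if x == 4 else 4 if x == 5 else x for x in p)

# The swap is a bijection onto the "5 before 4" permutations,
# so the two sets have the same size.
after = {swap45(p) for p in before}
print(after == {p for p in perms if p.index(5) < p.index(4)})  # True
```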
CS 174 Spr 99 Lecture 10 Summary
Example 1 Let's consider the integers from 1 to 10,000. The properties are defined as follows: E1 = the property that an integer is divisible by 4, E2 = the property that an integer is divisible by 5, and E3 = the property that an integer is divisible by 6. The probabilities of individual divisibility are easily computed: Pr[E1] = probability the number is divisible by 4 = 1/4, Pr[E2] = probability the number is ...
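A minimal sketch (not from the summary) that checks the stated probabilities by direct counting over the integers 1 to 10,000:

```python
N = 10_000
nums = range(1, N + 1)

# Count how many integers in 1..N satisfy each divisibility property.
for name, d in [("E1", 4), ("E2", 5), ("E3", 6)]:
    count = sum(1 for n in nums if n % d == 0)
    print(name, count / N)  # E1 -> 0.25, E2 -> 0.2, E3 -> 0.1666
```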